Randomized Douglas-Rachford Splitting Algorithms for Federated Composite Optimization

Quoc Tran-Dinh (University of North Carolina)

29-Sep-2021, 01:00-02:00

Abstract: In this talk, we present two randomized Douglas-Rachford splitting algorithms for solving a class of composite nonconvex finite-sum optimization problems arising from federated learning. Our algorithms rely on a combination of three main techniques: the Douglas-Rachford splitting scheme, a randomized block-coordinate technique, and an asynchronous strategy. We show that our algorithms achieve the best-known communication complexity bounds under standard assumptions in the nonconvex setting, while allowing inexact local model updates with only a subset of users in each round and handling nonsmooth convex regularizers. Our second algorithm can be implemented in an asynchronous mode using a general probabilistic model that captures different computational architectures. We illustrate our algorithms on many numerical examples and show that the new algorithms perform promisingly compared to common existing methods.
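To make the splitting scheme named in the abstract concrete, here is a minimal sketch of the classical (deterministic, single-machine) Douglas-Rachford iteration for min_x f(x) + g(x). The toy instance below, f(x) = 0.5*(x - a)^2 with an l1 regularizer g(x) = lam*|x|, is an illustrative assumption and not the federated algorithm of the talk, which randomizes such updates over blocks of users.

```python
def prox_f(v, a):
    # Prox of f(x) = 0.5*(x - a)^2 with unit step:
    # argmin_x 0.5*(x - a)^2 + 0.5*(x - v)^2 = (v + a) / 2.
    return 0.5 * (v + a)

def prox_g(v, lam):
    # Prox of g(x) = lam*|x| (soft-thresholding): the nonsmooth regularizer step.
    return max(abs(v) - lam, 0.0) * (1.0 if v >= 0 else -1.0)

def douglas_rachford(a=3.0, lam=1.0, iters=60):
    """Classical Douglas-Rachford splitting on min_x 0.5*(x-a)^2 + lam*|x|."""
    z = 0.0
    for _ in range(iters):
        y = prox_f(z, a)                       # prox step on the smooth part
        z = z + prox_g(2.0 * y - z, lam) - y   # reflect, prox on g, then correct z
        # (the talk's federated variants perform such updates inexactly,
        #  on a random subset of user blocks per communication round)
    return prox_f(z, a)                        # recover the primal iterate from z

print(douglas_rachford())  # the minimizer of 0.5*(x-3)^2 + |x| is x = 2
```

For this strongly convex instance the iteration contracts linearly, so a few dozen iterations already recover the closed-form soft-thresholded solution.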

This talk is based on joint work with Nhan Pham (UNC), Lam M. Nguyen (IBM), and Dzung Phan (IBM).

optimization and control

Audience: researchers in the topic


Variational Analysis and Optimisation Webinar

Series comments: Register at www.mocao.org/va-webinar/ to receive the Zoom connection details.

Organizers: Hoa Bui*, Matthew Tam*, Minh Dao, Alex Kruger, Vera Roshchina*, Guoyin Li
*contact for this listing
